Generated for model: models/resnet50_prune40pct_best_model.pth
----------------------------------------------------------------
Layer (type) Output Shape Param #
================================================================
Conv2d-1 [-1, 64, 112, 112] 9,408
BatchNorm2d-2 [-1, 64, 112, 112] 128
MaxPool2d-3 [-1, 64, 56, 56] 0
Conv2d-4 [-1, 64, 56, 56] 4,096
BatchNorm2d-5 [-1, 64, 56, 56] 128
Conv2d-6 [-1, 64, 56, 56] 36,864
BatchNorm2d-7 [-1, 64, 56, 56] 128
Conv2d-8 [-1, 256, 56, 56] 16,384
BatchNorm2d-9 [-1, 256, 56, 56] 512
Conv2d-10 [-1, 256, 56, 56] 16,384
BatchNorm2d-11 [-1, 256, 56, 56] 512
Bottleneck-12 [-1, 256, 56, 56] 0
Conv2d-13 [-1, 64, 56, 56] 16,384
BatchNorm2d-14 [-1, 64, 56, 56] 128
Conv2d-15 [-1, 64, 56, 56] 36,864
BatchNorm2d-16 [-1, 64, 56, 56] 128
Conv2d-17 [-1, 256, 56, 56] 16,384
BatchNorm2d-18 [-1, 256, 56, 56] 512
Bottleneck-19 [-1, 256, 56, 56] 0
Conv2d-20 [-1, 64, 56, 56] 16,384
BatchNorm2d-21 [-1, 64, 56, 56] 128
Conv2d-22 [-1, 64, 56, 56] 36,864
BatchNorm2d-23 [-1, 64, 56, 56] 128
Conv2d-24 [-1, 256, 56, 56] 16,384
BatchNorm2d-25 [-1, 256, 56, 56] 512
Bottleneck-26 [-1, 256, 56, 56] 0
Conv2d-27 [-1, 128, 56, 56] 32,768
BatchNorm2d-28 [-1, 128, 56, 56] 256
Conv2d-29 [-1, 128, 28, 28] 147,456
BatchNorm2d-30 [-1, 128, 28, 28] 256
Conv2d-31 [-1, 512, 28, 28] 65,536
BatchNorm2d-32 [-1, 512, 28, 28] 1,024
Conv2d-33 [-1, 512, 28, 28] 131,072
BatchNorm2d-34 [-1, 512, 28, 28] 1,024
Bottleneck-35 [-1, 512, 28, 28] 0
Conv2d-36 [-1, 128, 28, 28] 65,536
BatchNorm2d-37 [-1, 128, 28, 28] 256
Conv2d-38 [-1, 128, 28, 28] 147,456
BatchNorm2d-39 [-1, 128, 28, 28] 256
Conv2d-40 [-1, 512, 28, 28] 65,536
BatchNorm2d-41 [-1, 512, 28, 28] 1,024
Bottleneck-42 [-1, 512, 28, 28] 0
Conv2d-43 [-1, 128, 28, 28] 65,536
BatchNorm2d-44 [-1, 128, 28, 28] 256
Conv2d-45 [-1, 128, 28, 28] 147,456
BatchNorm2d-46 [-1, 128, 28, 28] 256
Conv2d-47 [-1, 512, 28, 28] 65,536
BatchNorm2d-48 [-1, 512, 28, 28] 1,024
Bottleneck-49 [-1, 512, 28, 28] 0
Conv2d-50 [-1, 128, 28, 28] 65,536
BatchNorm2d-51 [-1, 128, 28, 28] 256
Conv2d-52 [-1, 128, 28, 28] 147,456
BatchNorm2d-53 [-1, 128, 28, 28] 256
Conv2d-54 [-1, 512, 28, 28] 65,536
BatchNorm2d-55 [-1, 512, 28, 28] 1,024
Bottleneck-56 [-1, 512, 28, 28] 0
Conv2d-57 [-1, 256, 28, 28] 131,072
BatchNorm2d-58 [-1, 256, 28, 28] 512
Conv2d-59 [-1, 256, 14, 14] 589,824
BatchNorm2d-60 [-1, 256, 14, 14] 512
Conv2d-61 [-1, 1024, 14, 14] 262,144
BatchNorm2d-62 [-1, 1024, 14, 14] 2,048
Conv2d-63 [-1, 1024, 14, 14] 524,288
BatchNorm2d-64 [-1, 1024, 14, 14] 2,048
Bottleneck-65 [-1, 1024, 14, 14] 0
Conv2d-66 [-1, 256, 14, 14] 262,144
BatchNorm2d-67 [-1, 256, 14, 14] 512
Conv2d-68 [-1, 256, 14, 14] 589,824
BatchNorm2d-69 [-1, 256, 14, 14] 512
Conv2d-70 [-1, 1024, 14, 14] 262,144
BatchNorm2d-71 [-1, 1024, 14, 14] 2,048
Bottleneck-72 [-1, 1024, 14, 14] 0
Conv2d-73 [-1, 256, 14, 14] 262,144
BatchNorm2d-74 [-1, 256, 14, 14] 512
Conv2d-75 [-1, 256, 14, 14] 589,824
BatchNorm2d-76 [-1, 256, 14, 14] 512
Conv2d-77 [-1, 1024, 14, 14] 262,144
BatchNorm2d-78 [-1, 1024, 14, 14] 2,048
Bottleneck-79 [-1, 1024, 14, 14] 0
Conv2d-80 [-1, 256, 14, 14] 262,144
BatchNorm2d-81 [-1, 256, 14, 14] 512
Conv2d-82 [-1, 256, 14, 14] 589,824
BatchNorm2d-83 [-1, 256, 14, 14] 512
Conv2d-84 [-1, 1024, 14, 14] 262,144
BatchNorm2d-85 [-1, 1024, 14, 14] 2,048
Bottleneck-86 [-1, 1024, 14, 14] 0
Conv2d-87 [-1, 256, 14, 14] 262,144
BatchNorm2d-88 [-1, 256, 14, 14] 512
Conv2d-89 [-1, 256, 14, 14] 589,824
BatchNorm2d-90 [-1, 256, 14, 14] 512
Conv2d-91 [-1, 1024, 14, 14] 262,144
BatchNorm2d-92 [-1, 1024, 14, 14] 2,048
Bottleneck-93 [-1, 1024, 14, 14] 0
Conv2d-94 [-1, 256, 14, 14] 262,144
BatchNorm2d-95 [-1, 256, 14, 14] 512
Conv2d-96 [-1, 256, 14, 14] 589,824
BatchNorm2d-97 [-1, 256, 14, 14] 512
Conv2d-98 [-1, 1024, 14, 14] 262,144
BatchNorm2d-99 [-1, 1024, 14, 14] 2,048
Bottleneck-100 [-1, 1024, 14, 14] 0
Conv2d-101 [-1, 512, 14, 14] 524,288
BatchNorm2d-102 [-1, 512, 14, 14] 1,024
Conv2d-103 [-1, 512, 7, 7] 2,359,296
BatchNorm2d-104 [-1, 512, 7, 7] 1,024
Conv2d-105 [-1, 2048, 7, 7] 1,048,576
BatchNorm2d-106 [-1, 2048, 7, 7] 4,096
Conv2d-107 [-1, 2048, 7, 7] 2,097,152
BatchNorm2d-108 [-1, 2048, 7, 7] 4,096
Bottleneck-109 [-1, 2048, 7, 7] 0
Conv2d-110 [-1, 512, 7, 7] 1,048,576
BatchNorm2d-111 [-1, 512, 7, 7] 1,024
Conv2d-112 [-1, 512, 7, 7] 2,359,296
BatchNorm2d-113 [-1, 512, 7, 7] 1,024
Conv2d-114 [-1, 2048, 7, 7] 1,048,576
BatchNorm2d-115 [-1, 2048, 7, 7] 4,096
Bottleneck-116 [-1, 2048, 7, 7] 0
Conv2d-117 [-1, 512, 7, 7] 1,048,576
BatchNorm2d-118 [-1, 512, 7, 7] 1,024
Conv2d-119 [-1, 512, 7, 7] 2,359,296
BatchNorm2d-120 [-1, 512, 7, 7] 1,024
Conv2d-121 [-1, 2048, 7, 7] 1,048,576
BatchNorm2d-122 [-1, 2048, 7, 7] 4,096
Bottleneck-123 [-1, 2048, 7, 7] 0
AdaptiveAvgPool2d-124 [-1, 2048, 1, 1] 0
Linear-125 [-1, 1] 2,049
================================================================
Total params: 23,510,081
Trainable params: 23,510,081
Non-trainable params: 0
----------------------------------------------------------------
Input size (MB): 0.57
Forward/backward pass size (MB): 213.24
Params size (MB): 89.68
Estimated Total Size (MB): 303.50
----------------------------------------------------------------
================================================================
--- Custom Model Pruning Summary (Sparsity Analysis) ---
================================================================
Layer Name           | Total Params | Non-Zero Params | Sparsity (%)
----------------------------------------------------------------
conv1                |        9,408 |           7,920 |       15.82%
layer1.0.conv1       |        4,096 |           3,628 |       11.43%
layer1.0.conv2       |       36,864 |          25,387 |       31.13%
layer1.0.conv3       |       16,384 |          14,514 |       11.41%
layer1.0.shortcut.0  |       16,384 |          14,514 |       11.41%
layer1.1.conv1       |       16,384 |          12,743 |       22.22%
layer1.1.conv2       |       36,864 |          25,443 |       30.98%
layer1.1.conv3       |       16,384 |          14,504 |       11.47%
layer1.2.conv1       |       16,384 |          12,778 |       22.01%
layer1.2.conv2       |       36,864 |          25,359 |       31.21%
layer1.2.conv3       |       16,384 |          14,467 |       11.70%
layer2.0.conv1       |       32,768 |          25,592 |       21.90%
layer2.0.conv2       |      147,456 |          90,198 |       38.83%
layer2.0.conv3       |       65,536 |          55,103 |       15.92%
layer2.0.shortcut.0  |      131,072 |         101,587 |       22.50%
layer2.1.conv1       |       65,536 |          46,153 |       29.58%
layer2.1.conv2       |      147,456 |          90,450 |       38.66%
layer2.1.conv3       |       65,536 |          55,130 |       15.88%
layer2.2.conv1       |       65,536 |          46,220 |       29.47%
layer2.2.conv2       |      147,456 |          90,652 |       38.52%
layer2.2.conv3       |       65,536 |          55,070 |       15.97%
layer2.3.conv1       |       65,536 |          46,105 |       29.65%
layer2.3.conv2       |      147,456 |          90,640 |       38.53%
layer2.3.conv3       |       65,536 |          55,296 |       15.62%
layer3.0.conv1       |      131,072 |          92,347 |       29.54%
layer3.0.conv2       |      589,824 |         330,295 |       44.00%
layer3.0.conv3       |      262,144 |         204,248 |       22.09%
layer3.0.shortcut.0  |      524,288 |         365,512 |       30.28%
layer3.1.conv1       |      262,144 |         164,581 |       37.22%
layer3.1.conv2       |      589,824 |         330,128 |       44.03%
layer3.1.conv3       |      262,144 |         204,418 |       22.02%
layer3.2.conv1       |      262,144 |         163,718 |       37.55%
layer3.2.conv2       |      589,824 |         327,500 |       44.47%
layer3.2.conv3       |      262,144 |         204,938 |       21.82%
layer3.3.conv1       |      262,144 |         163,251 |       37.72%
layer3.3.conv2       |      589,824 |         326,478 |       44.65%
layer3.3.conv3       |      262,144 |         204,826 |       21.87%
layer3.4.conv1       |      262,144 |         162,914 |       37.85%
layer3.4.conv2       |      589,824 |         324,308 |       45.02%
layer3.4.conv3       |      262,144 |         205,076 |       21.77%
layer3.5.conv1       |      262,144 |         162,228 |       38.11%
layer3.5.conv2       |      589,824 |         322,094 |       45.39%
layer3.5.conv3       |      262,144 |         204,413 |       22.02%
layer4.0.conv1       |      524,288 |         324,226 |       38.16%
layer4.0.conv2       |    2,359,296 |       1,202,060 |       49.05%
layer4.0.conv3       |    1,048,576 |         733,898 |       30.01%
layer4.0.shortcut.0  |    2,097,152 |       1,290,734 |       38.45%
layer4.1.conv1       |    1,048,576 |         584,341 |       44.27%
layer4.1.conv2       |    2,359,296 |       1,206,573 |       48.86%
layer4.1.conv3       |    1,048,576 |         732,732 |       30.12%
layer4.2.conv1       |    1,048,576 |         585,692 |       44.14%
layer4.2.conv2       |    2,359,296 |       1,200,440 |       49.12%
layer4.2.conv3       |    1,048,576 |         729,516 |       30.43%
linear               |        2,048 |           1,238 |       39.55%
----------------------------------------------------------------
TOTAL PRUNABLE       |   23,456,960 |      14,074,176 |       40.00%
================================================================
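The per-layer counts in the Sparsity Analysis section follow from comparing total vs. non-zero entries in each prunable weight tensor. A minimal sketch of that accounting, using a small illustrative model with simulated 40% unstructured magnitude pruning (not the actual ResNet-50 checkpoint or the exact script that produced the table):

```python
import torch
import torch.nn as nn

# Illustrative stand-in model; the real report walks a pruned ResNet-50.
model = nn.Sequential(
    nn.Conv2d(3, 8, 3, bias=False),
    nn.Conv2d(8, 8, 3, bias=False),
    nn.Linear(8, 1, bias=False),
)

# Simulate ~40% unstructured magnitude pruning: zero the smallest-magnitude
# 40% of entries in each weight tensor.
with torch.no_grad():
    for m in model:
        w = m.weight
        k = int(0.4 * w.numel())
        thresh = w.abs().flatten().kthvalue(k).values
        w[w.abs() <= thresh] = 0.0

# Count total vs. non-zero weights per prunable layer, as in the table.
rows = []
for name, m in model.named_modules():
    if isinstance(m, (nn.Conv2d, nn.Linear)):
        w = m.weight.detach()
        total, nonzero = w.numel(), int((w != 0).sum())
        sparsity = 100.0 * (total - nonzero) / total
        rows.append((name, total, nonzero, sparsity))
        print(f"{name:<10} | {total:>9,} | {nonzero:>9,} | {sparsity:5.2f}%")
```

The same loop over `named_modules()` also yields the TOTAL PRUNABLE row by summing `total` and `nonzero` across layers; BatchNorm and bias parameters are excluded, which is why the prunable total (23,456,960) is slightly below the full parameter count above.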
Heatmap of the weight matrix for a sample layer. White pixels represent pruned (zeroed) weights.
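A heatmap like this can be rendered directly from a weight tensor's zero mask. A sketch assuming matplotlib and a simulated 40%-pruned 64x64 weight matrix (the real figure would slice a layer from the loaded checkpoint instead):

```python
import torch
import matplotlib
matplotlib.use("Agg")  # render without a display
import matplotlib.pyplot as plt

# Simulated weight matrix with ~40% of entries zeroed (unstructured pruning).
w = torch.randn(64, 64)
w[torch.rand_like(w) < 0.4] = 0.0

# Binary mask: 1 where a weight was pruned, so pruned entries render white.
pruned = (w == 0).float()
plt.figure(figsize=(4, 4))
plt.imshow(pruned, cmap="gray", vmin=0.0, vmax=1.0)
plt.title("Pruned weights (white) in a sample layer")
plt.axis("off")
plt.savefig("prune_heatmap.png", dpi=150, bbox_inches="tight")
```

For a real Conv2d layer the 4-D weight tensor would first be reshaped 2-D, e.g. `w.view(w.size(0), -1)`, before building the mask.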